PyDigger - unearthing stuff about Python


Packages by Tri Dao

Name                      Version        Summary                                                        Date
flash-attn                2.8.1          Flash Attention: Fast and Memory-Efficient Exact Attention    2025-07-10 05:16:39
causal-conv1d             1.5.0.post8    Causal depthwise conv1d in CUDA, with a PyTorch interface     2024-12-06 09:48:49
quant-matmul              1.2.0          Quantized MatMul in CUDA with a PyTorch interface             2024-03-20 03:44:36
fast-hadamard-transform   1.0.4.post1    Fast Hadamard Transform in CUDA, with a PyTorch interface     2024-02-13 05:49:17
flash-attn-wheels-test    2.0.8.post17   Flash Attention: Fast and Memory-Efficient Exact Attention    2023-08-13 21:27:09
flash-attn-xwyzsn         1.0.7          Flash Attention: Fast and Memory-Efficient Exact Attention    2023-06-01 03:53:40
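These packages all wrap custom CUDA kernels behind a thin PyTorch interface. As an illustration for the headline entry, here is a minimal sketch of calling flash-attn's flash_attn_func, assuming flash-attn 2.x is installed and a CUDA GPU is available; the tensor layout and the causal flag follow the library's documented conventions:

```python
# Minimal sketch: flash-attn's PyTorch interface (assumes flash-attn 2.x + CUDA GPU).
import torch
from flash_attn import flash_attn_func

batch, seqlen, nheads, headdim = 2, 1024, 8, 64

# flash-attn kernels expect fp16/bf16 tensors on a CUDA device,
# laid out as (batch, seqlen, nheads, headdim).
q = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")
k = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")
v = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")

# Exact attention; causal=True applies an autoregressive mask.
out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # torch.Size([2, 1024, 8, 64])
```

The result matches standard softmax attention exactly; the speed and memory savings come from tiling the computation so the full seqlen-by-seqlen score matrix is never materialized.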